
Extrapolation-Aware Nonparametric Statistical Inference

Pfister, Niklas, Bühlmann, Peter

arXiv.org Machine Learning

We define extrapolation as any type of statistical inference on a conditional function (e.g., a conditional expectation or conditional quantile) evaluated outside of the support of the conditioning variable. This type of extrapolation occurs in many data analysis applications and can invalidate the resulting conclusions if not taken into account. While extrapolating is straightforward in parametric models, it becomes challenging in nonparametric models. In this work, we extend the nonparametric statistical model to explicitly allow for extrapolation and introduce a class of extrapolation assumptions that can be combined with existing inference techniques to draw extrapolation-aware conclusions. The proposed class of extrapolation assumptions stipulates that the conditional function attains its minimal and maximal directional derivative, in each direction, within the observed support. We illustrate how the framework applies to several statistical applications, including prediction and uncertainty quantification. We furthermore propose a consistent estimation procedure that can be used to adjust existing nonparametric estimates to account for extrapolation by providing lower and upper extrapolation bounds. The procedure is empirically evaluated on both simulated and real-world data.
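The bounding idea in this abstract, that the conditional function's extreme directional derivatives are attained inside the observed support, can be sketched in one dimension. This is an illustrative simplification, not the paper's estimator; the function name and the piecewise-slope construction are assumptions for the example:

```python
import numpy as np

def extrapolation_bounds(x_obs, f_hat, x_query):
    """Bound f at a query point outside the observed support, assuming the
    extreme slopes of f are attained within the support (1-D sketch)."""
    slopes = np.diff(f_hat) / np.diff(x_obs)   # slopes observed in-support
    s_min, s_max = slopes.min(), slopes.max()  # extreme directional derivatives
    # Extrapolate linearly from the nearest edge of the support.
    if x_query > x_obs[-1]:
        x_edge, f_edge = x_obs[-1], f_hat[-1]
    else:
        x_edge, f_edge = x_obs[0], f_hat[0]
    d = x_query - x_edge
    return f_edge + min(s_min * d, s_max * d), f_edge + max(s_min * d, s_max * d)

# Example: nonparametric estimates on [0, 3], query at 4 (outside the support).
x = np.array([0.0, 1.0, 2.0, 3.0])
f = np.array([0.0, 1.0, 1.5, 3.0])
lo, hi = extrapolation_bounds(x, f, 4.0)  # observed slopes lie in [0.5, 1.5],
                                          # so the bounds are 3.5 and 4.5
```

Any extrapolation-aware conclusion at the query point must then hold for every value in the interval `[lo, hi]` rather than for a single point estimate.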


Local Boosting for Weakly-Supervised Learning

Zhang, Rongzhi, Yu, Yue, Shen, Jiaming, Cui, Xiquan, Zhang, Chao

arXiv.org Artificial Intelligence

Boosting is a commonly used technique to enhance the performance of a set of base models by combining them into a strong ensemble model. Though widely adopted, boosting is typically used in supervised learning, where the data is labeled accurately. In weakly supervised learning, where most of the data is labeled through weak and noisy sources, it remains nontrivial to design effective boosting approaches. In this work, we show that the standard convex combination of base learners performs poorly in the presence of noisy labels. Instead, we propose $\textit{LocalBoost}$, a novel framework for weakly-supervised boosting. LocalBoost iteratively boosts the ensemble model along two dimensions, i.e., intra-source and inter-source. The intra-source boosting introduces locality to the base learners and enables each base learner to focus on a particular feature regime by training new base learners on granularity-varying error regions. For the inter-source boosting, we leverage a conditional function to indicate the weak source where a sample is more likely to appear. To account for the weak labels, we further design an estimate-then-modify approach to compute the model weights. Experiments on seven datasets show that our method significantly outperforms vanilla boosting methods and other weakly-supervised methods.


Updating with Belief Functions, Ordinal Conditioning Functions and Possibility Measures

Dubois, Didier, Prade, Henri

arXiv.org Artificial Intelligence

This paper discusses how a measure of uncertainty representing a state of knowledge can be updated when new information, which may itself be pervaded with uncertainty, becomes available. This problem is considered in several frameworks, namely Shafer's evidence theory, Zadeh's possibility theory, and Spohn's theory of epistemic states. In the first two cases, analogues of Jeffrey's rule of conditioning are introduced and discussed. The relations between Spohn's model and possibility theory are emphasized, and Spohn's updating rule is contrasted with the Jeffrey-like rule of conditioning in possibility theory. Recent results by Shenoy on the combination of ordinal conditional functions are reinterpreted in the language of possibility theory. It is shown that Shenoy's combination rule has a well-known possibilistic counterpart.


A Default Logical Semantics for Defeasible Argumentation

Kern-Isberner, Gabriele (Technische Universität Dortmund) | Simari, Guillermo R (Universidad Nacional del Sur, Argentina)

AAAI Conferences

Defeasible argumentation and default reasoning are usually perceived as two similar, but distinct, approaches to commonsense reasoning. In this paper, we combine the two fields by viewing (defeasible resp. default) rules as a crucial common part of both areas. We make use of possible worlds semantics from default reasoning to provide examples for arguments, and carry over the notion of plausibility to the argumentative framework. Moreover, we base a priority relation between arguments on the tolerance partitioning of system Z and obtain a criterion, phrased in system Z terms, that ensures warrant in defeasible argumentation.


Mining Default Rules from Statistical Data

Kern-Isberner, Gabriele (Technische Universität Dortmund) | Thimm, Matthias (Technische Universität Dortmund) | Finthammer, Marc (FernUniversität in Hagen) | Fisseler, Jens (FernUniversität in Hagen)

AAAI Conferences

In this paper, we are interested in the qualitative knowledge that underlies some given probabilistic information. To represent such qualitative structures, we use ordinal conditional functions (OCFs), also known as ranking functions, as a qualitative abstraction of probability functions. The basic idea for transforming probabilities into ordinal rankings is to find well-behaved clusterings of the negative logarithms of the probabilities. We show how popular clustering tools can be used for this, and propose measures for evaluating the clustering results in this context. From the ranking functions obtained in this way, we extract conditionals that may serve as a base for inductive default reasoning.
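The probability-to-rank transformation described here can be illustrated with a minimal sketch. The gap-based one-dimensional clustering and the function name below are assumptions for illustration; the paper itself evaluates standard clustering tools:

```python
import math

def ocf_from_probabilities(probs, k=3):
    """Assign ordinal ranks to worlds by clustering the negative
    logarithms of their probabilities into k groups (sketch)."""
    neg_logs = [-math.log(p) for p in probs]
    order = sorted(range(len(probs)), key=lambda i: neg_logs[i])
    sorted_vals = [neg_logs[i] for i in order]
    # Cluster the sorted values by cutting at the k-1 largest gaps.
    cut_positions = sorted(
        sorted(range(1, len(sorted_vals)),
               key=lambda i: sorted_vals[i] - sorted_vals[i - 1],
               reverse=True)[:k - 1]
    )
    ranks, rank = {}, 0
    for pos, idx in enumerate(order):
        if pos in cut_positions:
            rank += 1
        ranks[idx] = rank
    return ranks  # world index -> ordinal rank (0 = most plausible)

# Example: six worlds with decreasing probability.
print(ocf_from_probabilities([0.5, 0.3, 0.1, 0.05, 0.04, 0.01], k=3))
# → {0: 0, 1: 0, 2: 1, 3: 1, 4: 1, 5: 2}
```

Worlds whose negative log-probabilities fall in the same cluster receive the same rank, which is exactly the kind of qualitative abstraction from which conditionals can then be extracted.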